A Novel Mixed-Attribute Fusion-Based Naive Bayesian Classifier
Authors
Abstract
The Naive Bayesian classifier (NBC) is a well-known classification model with a simple structure, low training complexity, excellent scalability, and good performance. However, the NBC has two key limitations: (1) it is built upon the strong assumption that condition attributes are independent, which often does not hold in real-life applications, and (2) it does not handle continuous attributes well. To overcome these limitations, this paper presents a novel approach for NBC construction, called the mixed-attribute fusion-based NBC (MAF-NBC). It alleviates the aforementioned limitations by relying on a mixed-attribute fusion mechanism with an improved autoencoder neural network for NBC construction. MAF-NBC transforms the original mixed attributes of a data set into a series of encoded attributes with maximum independence as a pre-processing step. To guarantee the generation of useful encoded attributes, an efficient objective function is designed to optimize the network weights by considering both the encoding error and the attribute dependence. A number of persuasive experiments were conducted to validate the feasibility, rationality, and effectiveness of the designed MAF-NBC approach. The experimental results demonstrate that MAF-NBC achieves superior classification performance compared with eight state-of-the-art Bayesian algorithms, namely the discretization-based NBC (Dis-NBC), flexible naive Bayes (FNB), tree-augmented naive Bayes (TAN), averaged one-dependent estimator (AODE), hidden naive Bayes (HNB), deep feature weighting NBC (DFW-NBC), correlation-based feature weighting NBC (CFW-NBC), and independent component analysis-based NBC (ICA-NBC).
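The abstract describes an objective that balances autoencoder encoding error against the statistical dependence between encoded attributes. A minimal pure-Python sketch of such a combined objective is given below; the linear tied-weight encoder/decoder, the squared-correlation dependence measure, and all names (`encode`, `maf_objective`, `lam`) are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a MAF-style objective:
#   loss = reconstruction error + lam * dependence between encoded attributes.
# A linear encoder Z = X W with tied decoder X_hat = Z W^T is assumed here
# purely for illustration; the paper uses an improved autoencoder network.
import math
import random


def encode(X, W):
    # Z = X W, where X is n x d and W is d x k
    return [[sum(x[i] * W[i][j] for i in range(len(W)))
             for j in range(len(W[0]))] for x in X]


def decode(Z, W):
    # X_hat = Z W^T
    return [[sum(z[j] * W[i][j] for j in range(len(z)))
             for i in range(len(W))] for z in Z]


def reconstruction_error(X, X_hat):
    # squared encoding/decoding error over all cells
    return sum((a - b) ** 2
               for x, xh in zip(X, X_hat) for a, b in zip(x, xh))


def dependence_penalty(Z):
    # sum of squared Pearson correlations between distinct encoded attributes;
    # smaller values mean the encoded attributes are closer to independent
    n, k = len(Z), len(Z[0])
    cols = [[z[j] for z in Z] for j in range(k)]
    means = [sum(c) / n for c in cols]
    stds = [math.sqrt(sum((v - m) ** 2 for v in c) / n) or 1.0
            for c, m in zip(cols, means)]
    total = 0.0
    for a in range(k):
        for b in range(a + 1, k):
            cov = sum((cols[a][i] - means[a]) * (cols[b][i] - means[b])
                      for i in range(n)) / n
            total += (cov / (stds[a] * stds[b])) ** 2
    return total


def maf_objective(X, W, lam=0.5):
    # the combined objective a training loop would minimize over W
    Z = encode(X, W)
    return reconstruction_error(X, decode(Z, W)) + lam * dependence_penalty(Z)
```

Minimizing this objective over `W` (by gradient descent or similar) would yield encoded attributes that both preserve the original mixed attributes and are mutually close to independent, which is the property a downstream NBC relies on.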
Similar Resources
Nomograms for Visualization of Naive Bayesian Classifier
Besides good predictive performance, the naive Bayesian classifier can also offer a valuable insight into the structure of the training data and effects of the attributes on the class probabilities. This structure may be effectively revealed through visualization of the classifier. We propose a new way to visualize the naive Bayesian model in the form of a nomogram. The advantages of the propos...
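A naive Bayes nomogram plots, for each attribute value, its additive contribution to the class log odds. A minimal sketch of that quantity is shown below; the conditional probabilities are invented for illustration and the function name is hypothetical.

```python
# The quantity a naive Bayes nomogram visualizes for each attribute value a:
#   log( P(a | class+) / P(a | class-) )
# Contributions are additive in log-odds space, so summing them (plus the
# prior log odds) gives the posterior log odds under the NB assumption.
import math


def log_odds_contribution(p_attr_given_pos, p_attr_given_neg):
    # positive -> the attribute value favors the positive class
    return math.log(p_attr_given_pos / p_attr_given_neg)


# invented conditional probabilities for a single attribute value
contrib = log_odds_contribution(0.8, 0.4)
```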
Improving Naive Bayesian Classifier by Discriminative Training
Discriminative classifiers such as Support Vector Machines (SVM) directly learn a discriminant function or a posterior probability model to perform classification. On the other hand, generative classifiers often learn a joint probability model and then use the Bayes rule to construct a posterior classifier. In general, generative classifiers are not as accurate as discriminative classifiers. Ho...
Extended Naive Bayes classifier for mixed data
The Naive Bayes induction algorithm is very popular in the classification field. The traditional method for dealing with numeric data is to discretize numeric attributes into symbols. The choice among distinct discretization criteria has a significant effect on performance. Moreover, several recent studies have employed the normal distribution to handle numeric data, but using only one value to estimate...
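The traditional discretization step mentioned above can be sketched with equal-width binning, one of the simplest criteria; the function below is an illustrative assumption, not the scheme used in any of the cited papers.

```python
# Equal-width binning: map each numeric value to one of k bins of equal
# width spanning [min, max]. The resulting bin indices act as the symbolic
# values a discretization-based naive Bayes would count.
def equal_width_bins(values, k):
    lo, hi = min(values), max(values)
    width = (hi - lo) / k or 1.0  # avoid division by zero on constant data
    def bin_of(v):
        b = int((v - lo) / width)
        return min(b, k - 1)  # clamp the maximum value into the last bin
    return [bin_of(v) for v in values]
```

Because the bin edges depend only on the observed minimum and maximum, this criterion is sensitive to outliers, which is one reason the choice of discretization scheme affects performance.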
Voting Principle Based on Nearest Kernel Classifier and Naive Bayesian Classifier
This paper presented a voting principle based on multiple classifiers. The voting principle combines the naïve Bayesian classification algorithm with a newly proposed nearest-to-class-kernel classifier. The recognition ability of each classifier varies from sample to sample. A model of each classifier was obtained by training on the training data, which acts as bas...
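The voting idea above can be sketched as a simple majority vote over independently trained classifiers; the stand-in lambda classifiers below are illustrative assumptions, not the paper's kernel or Bayesian models.

```python
# Majority voting over a pool of classifiers: each classifier predicts a
# label for x, and the most frequent label wins.
from collections import Counter


def majority_vote(classifiers, x):
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]


# stand-in classifiers (threshold rules) purely for demonstration
clfs = [
    lambda x: 'A' if x > 0 else 'B',
    lambda x: 'A' if x > 1 else 'B',
    lambda x: 'A',
]
```

A weighted variant, where each vote is scaled by the classifier's estimated reliability on similar samples, is the usual refinement when recognition ability differs per sample.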
A Heterogeneous Naive-Bayesian Classifier for Relational Databases
Geetha Manjunath, M. Narasimha Murty, Dinkar Sitaram. HP Laboratories, HPL-2009-225. Keywords: relational databases, classification, data mining, RDF. Most enterprise data is distributed in multiple relational databases with expert-designed schema. Application of single-table data mining techniques to distributed relational data not only inc...
Journal
Journal title: Applied Sciences
Year: 2022
ISSN: 2076-3417
DOI: https://doi.org/10.3390/app122010443